Fast and Correct Gradient-Based Optimisation for Probabilistic Programming via Smoothing

Authors

Abstract

We study the foundations of variational inference, which frames posterior inference as an optimisation problem, for probabilistic programming. The dominant approach for optimisation in practice is stochastic gradient descent. In particular, a variant using the so-called reparameterisation gradient estimator exhibits fast convergence in a traditional statistics setting. Unfortunately, discontinuities, which are readily expressible in programming languages, can compromise the correctness of this approach. We consider a simple (higher-order, probabilistic) programming language with conditionals, and we endow our language with both a measurable and a smoothed (approximate) value semantics. We present type systems which establish the technical pre-conditions. Thus we can prove stochastic gradient descent to be correct when applied to the smoothed problem. Moreover, we can solve the original problem up to any error tolerance by choosing the accuracy coefficient suitably. Empirically we demonstrate that our approach achieves convergence similar to that of a key competitor, but is simpler, faster, and attains an orders-of-magnitude reduction in work-normalised variance.
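To make the abstract's two ingredients concrete, the sketch below illustrates (i) the reparameterisation gradient estimator for a Gaussian variational family N(theta, 1) and (ii) sigmoid-based smoothing of a conditional, in the spirit of the paper's smoothed value semantics. This is a minimal illustration under assumptions, not the authors' implementation: the toy branching density, the unit-variance variational family, and the role of eta as the accuracy coefficient are all choices made for the example, and finite differences stand in for automatic differentiation.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy unnormalised log-density with a conditional: the hard branch
# "if z < 0" makes it discontinuous in z.
def log_density_hard(z):
    return -0.5 * (z + 2.0) ** 2 if z < 0.0 else -0.5 * (z - 2.0) ** 2

# Smoothed semantics (illustrative): the branch indicator [z < 0] is
# replaced by sigmoid(-z / eta); as the accuracy coefficient eta -> 0
# this converges back to the hard conditional above.
def log_density_smooth(z, eta=0.1):
    w = sigmoid(-z / eta)
    return w * (-0.5 * (z + 2.0) ** 2) + (1.0 - w) * (-0.5 * (z - 2.0) ** 2)

# Reparameterisation estimator: write z = theta + eps with eps ~ N(0, 1)
# and differentiate through z (central differences mimic autodiff here).
def grad_estimate(theta, eta=0.1, n=1000, h=1e-5):
    eps = rng.standard_normal(n)
    z = theta + eps
    up = np.array([log_density_smooth(zi + h, eta) for zi in z])
    dn = np.array([log_density_smooth(zi - h, eta) for zi in z])
    return float(np.mean((up - dn) / (2.0 * h)))

theta = 0.0
for _ in range(200):
    theta += 0.05 * grad_estimate(theta)  # stochastic gradient ascent
print("theta after SGD:", theta)

Differentiating log_density_hard directly would ignore the jump at z = 0 and bias the estimator; on the smoothed objective the estimator is well-behaved, and, as the abstract states, shrinking the accuracy coefficient recovers the original problem up to any error tolerance.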


Similar resources

Image smoothing via L0 gradient minimization

We present a new image editing method, particularly effective for sharpening major edges by increasing the steepness of transition while eliminating a manageable degree of low-amplitude structures. The seemingly contradictive effect is achieved in an optimization framework making use of L0 gradient minimization, which can globally control how many non-zero gradients are resulted in to approxima...
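For orientation, since the excerpt is cut off: the L0 gradient minimisation objective that this line of work popularised (Xu et al., 2011) is usually written as below. This is the standard formulation, supplied as background rather than quoted from the truncated text.

\min_{S}\ \sum_{p}\bigl(S_p - I_p\bigr)^2 \;+\; \lambda\, C(S),
\qquad
C(S) \;=\; \#\bigl\{\, p : |\partial_x S_p| + |\partial_y S_p| \neq 0 \,\bigr\}

Here I is the input image, S the smoothed output, and \lambda trades fidelity against the count of pixels with non-zero gradient, which is what lets the method control global structure sparsity.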


Smoothing a Probabilistic Lexicon via Syntactic Transformations

Jason Michael Eisner (supervisor: Professor Mitch Marcus). Probabilistic parsing requires a lexicon that specifies each word’s syntactic preferences in terms of probabilities. To estimate these probabilities for words that were poorly observed during training, this thesis assumes the existence of arbitrarily powerful transformations (...


Scalable Correct Memory Ordering via Relativistic Programming

We propose and document a new concurrent programming model, relativistic programming. This model allows readers to run concurrently with writers, without blocking or using expensive synchronization. Relativistic programming builds on existing synchronization primitives that allow writers to wait for current readers to finish with minimal reader overhead. Our methodology models data structures a...


Extending Particle Swarm Optimisation via Genetic Programming

Particle Swarm Optimisers (PSOs) search using a set of interacting particles flying over the fitness landscape. These are typically controlled by forces that encourage each particle to fly back both towards the best point sampled by it and towards the swarm’s best. Here we explore the possibility of evolving optimal force generating equations to control the particles in a PSO using genetic prog...
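The "forces" this excerpt refers to correspond, in the standard PSO formulation, to the velocity update below; this is textbook background rather than text from the paper, and the genetic-programming idea is to evolve replacements for exactly this rule.

v_i \leftarrow \omega\, v_i + c_1 r_1 \,(p_i - x_i) + c_2 r_2 \,(g - x_i),
\qquad
x_i \leftarrow x_i + v_i

where p_i is the best point sampled by particle i, g is the swarm's best, r_1, r_2 \sim U(0,1) are fresh random numbers, and \omega, c_1, c_2 are tuning constants.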


A gradient-based optimisation scheme for optical tomography.

Optical tomography schemes using non-linear optimisation are usually based on a Newton-like method involving the construction and inversion of a large Jacobian matrix. Although such matrices can be efficiently constructed using a reciprocity principle, their inversion is still computationally difficult. In this paper we demonstrate a simple means to obtain the gradient of the objective function d...
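As general background (not taken from the truncated text): for a least-squares data-fit objective, the gradient can be expressed against the Jacobian J without forming or inverting it, since only the product J^\top r is needed, and one such means is a single adjoint solve per iteration.

f(x) = \tfrac{1}{2}\,\lVert y - F(x)\rVert^2,
\qquad
\nabla f(x) = -J(x)^{\top}\bigl(y - F(x)\bigr)

This is why gradient-based schemes can sidestep the Newton-like construction and inversion of J that the excerpt describes as the computational bottleneck.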



Journal

Journal title: Lecture Notes in Computer Science

Year: 2023

ISSN: 1611-3349, 0302-9743

DOI: https://doi.org/10.1007/978-3-031-30044-8_18